Similar Resources
A note on sparse least-squares regression
We compute a sparse solution to the classical least-squares problem min_x ‖Ax − b‖_2, where A is an arbitrary matrix. We describe a novel algorithm for this sparse least-squares problem. The algorithm operates as follows: first, it selects columns from A, and then solves a least-squares problem only with the selected columns. The column selection algorithm that we use is known to perform well for t...
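As an illustration of the select-then-solve idea described in this abstract (not the authors' algorithm), here is a minimal numpy sketch that uses a hypothetical selection rule, keeping the k columns of A with largest norm, and then solves the restricted least-squares problem:

```python
# Minimal sketch: select k columns of A, then solve least squares on them only.
# The largest-column-norm rule below is a hypothetical placeholder, not the
# column selection algorithm used in the paper.
import numpy as np

def sparse_ls_by_column_selection(A, b, k):
    # Hypothetical rule: keep the k columns with largest Euclidean norm.
    idx = np.argsort(np.linalg.norm(A, axis=0))[-k:]
    # Solve the restricted problem min_z ||A[:, idx] z - b||_2.
    z, *_ = np.linalg.lstsq(A[:, idx], b, rcond=None)
    # Embed the solution back into a length-n vector with at most k nonzeros.
    x = np.zeros(A.shape[1])
    x[idx] = z
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x = sparse_ls_by_column_selection(A, b, k=5)
print(np.count_nonzero(x), np.linalg.norm(A @ x - b))
```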
Least-Squares Regression on Sparse Spaces
Another application is when one uses random projections to project each input vector into a lower-dimensional space, and then trains a predictor in the new compressed space (compression on the feature space). As is typical of dimensionality reduction techniques, this reduces the variance of most predictors at the expense of introducing some bias. Random projections on the feature space, alon...
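A minimal sketch of this feature-space compression setup, assuming a Gaussian random projection and ordinary least squares as the predictor (the abstract does not specify either choice):

```python
# Project each input vector into an m-dimensional space, then train a predictor
# on the compressed features. Gaussian projection and least squares are
# assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 200, 1000, 50                 # samples, original dim, compressed dim
X = rng.standard_normal((n, d))
w_true = np.zeros(d); w_true[:10] = 1.0
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Random projection of the feature space.
P = rng.standard_normal((d, m)) / np.sqrt(m)
X_compressed = X @ P

# Train the predictor in the compressed space.
w_compressed, *_ = np.linalg.lstsq(X_compressed, y, rcond=None)

# Predict for a new input by projecting it the same way.
x_new = rng.standard_normal(d)
y_hat = (x_new @ P) @ w_compressed
```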
Compressed Least-Squares Regression on Sparse Spaces
Recent advances in the area of compressed sensing suggest that it is possible to reconstruct high-dimensional sparse signals from a small number of random projections. Domains in which the sparsity assumption is applicable also offer many interesting large-scale machine learning prediction tasks. It is therefore important to study the effect of random projections as a dimensionality reduction m...
PEDOMODELS FITTING WITH FUZZY LEAST SQUARES REGRESSION
Pedomodels have become a popular topic in soil science and environmental research. They are predictive functions of certain soil properties based on other easily or cheaply measured properties. The common method for fitting pedomodels is to use classical regression analysis, based on the assumptions of data crispness and deterministic relations among variables. In modeling natural systems such as s...
Fast Sparse Least-Squares Regression with Non-Asymptotic Guarantees
In this paper, we study a fast approximation method for the large-scale high-dimensional sparse least-squares regression problem by exploiting the Johnson-Lindenstrauss (JL) transforms, which embed a set of high-dimensional vectors into a low-dimensional space. In particular, we propose to apply the JL transforms to the data matrix and the target vector and then to solve a sparse least-squares proble...
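A minimal sketch of the step this abstract describes, assuming a Gaussian matrix as the JL transform and a Lasso solver for the sparse least-squares step (the paper's exact transform and solver may differ):

```python
# Apply a JL transform S to both the data matrix A and the target b, then solve
# a sparse least-squares problem on the much smaller sketched data.
# Gaussian S and scikit-learn's Lasso are assumptions for illustration.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d, m = 5000, 300, 500                # rows, features, sketch size (m << n)
A = rng.standard_normal((n, d))
x_true = np.zeros(d); x_true[:5] = 2.0
b = A @ x_true + 0.01 * rng.standard_normal(n)

# JL transform applied to the data matrix and the target vector.
S = rng.standard_normal((m, n)) / np.sqrt(m)
A_sketch, b_sketch = S @ A, S @ b

# Sparse least-squares (Lasso) on the sketched problem.
model = Lasso(alpha=0.01).fit(A_sketch, b_sketch)
x_hat = model.coef_
```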
Journal
Journal title: Information Processing Letters
Year: 2014
ISSN: 0020-0190
DOI: 10.1016/j.ipl.2013.11.011